Davis and McHorse, 2013.

Palaeontologia Electronica.

A method for improved identification of postcrania from mammalian fossil assemblages: multivariate discriminant function analysis of camelid astragali.

Aepycamelus, one of these North American camelids

Dimensions on astragali measured in this study

library(curl) #needed to read in csv file from github
camels <- read.csv(curl("https://raw.githubusercontent.com/ilundeen/nothing-to-see-here/master/camel.csv"), header=TRUE) 
#Read in the .csv file

Linear Discriminant Function

Here we’re going to use the lda() function contained in the {MASS} package.
*Note: There are other packages that run linear discriminant analyses including:
{DiscriMiner}
{mda}
{dawai}
{rrlda}
{sparsediscrim}


library(MASS) #lda() and qda() can both be found in this package 
camel.lda <- lda(Genus..2. ~ LM + TD + TI + TP + LL + WD + WI + LI, data=camels, prior= c(1,1,1,1,1)/5, na.action="na.omit") # Note that we're using + instead of * because we aren't modeling any variable interactions here
# Prior is specified to be uninformative; if not included, default prior is set by relative abundances in the training set.
camel.lda
## Call:
## lda(Genus..2. ~ LM + TD + TI + TP + LL + WD + WI + LI, data = camels, 
##     prior = c(1, 1, 1, 1, 1)/5, na.action = "na.omit")
## 
## Prior probabilities of groups:
##  Aepycamelus     Alforjas Hemiauchenia  Megatylopus   Procamelus 
##          0.2          0.2          0.2          0.2          0.2 
## 
## Group means:
##                    LM       TD       TI       TP       LL       WD
## Aepycamelus  92.03708 37.36500 49.18667 41.68917 98.84625 66.96438
## Alforjas     58.12042 27.79625 33.21125 28.46583 63.85958 42.73292
## Hemiauchenia 50.45350 23.21675 27.96225 24.00013 54.34913 35.70287
## Megatylopus  79.62783 33.81348 43.73391 37.09565 87.04217 60.14348
## Procamelus   66.68000 28.91333 35.94000 33.16167 73.38333 47.09333
##                    WI       LI
## Aepycamelus  68.94229 78.17250
## Alforjas     44.22375 49.25000
## Hemiauchenia 38.46363 41.53437
## Megatylopus  61.44304 68.53522
## Procamelus   49.16167 55.34667
## 
## Coefficients of linear discriminants:
##            LD1         LD2          LD3         LD4
## LM  0.17843479 -0.26154962  0.612728795 -0.22214235
## TD -0.08833182  0.06511872 -0.237187959 -0.37465125
## TI  0.03972081  0.28547071  0.005929533 -0.28948794
## TP -0.10287848 -0.52113821 -0.301629648 -0.04092927
## LL -0.03361356 -0.29001249 -0.180872844  0.30903264
## WD  0.08460694  0.20045633 -0.236866116  0.32094209
## WI -0.04463453  0.04310350  0.243654169  0.09190961
## LI  0.23894225  0.48890370 -0.251019220 -0.15398852
## 
## Proportion of trace:
##    LD1    LD2    LD3    LD4 
## 0.9431 0.0377 0.0149 0.0044

Displayed in this output

Call:

The formula that was fitted

Prior probabilities of groups:

The prior probability of group membership: the assumed likelihood, before any measurements are considered, that an unknown sample belongs to each group. Here we set it to be uniform (0.2 for each of our five genera).
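If you leave the `prior` argument out, lda() falls back to the observed group frequencies in the training data. A quick sketch (assuming the `camels` data frame loaded above) of what that default prior would look like:

```r
# The default prior lda() would use: the relative abundance of each genus
# in the training sample (assumes camels from the chunk above)
prop.table(table(camels$Genus..2.))
```

Because our genera are unevenly sampled, that default would bias assignments toward the better-sampled groups, which is why we specified a uniform prior instead.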

Group means:

A table of the average value of each variable within each group. Useful for checking whether any group looks distinctive, or whether something in the data seems off.

Coefficients of linear discriminants:

Coefficients of the discriminant functions. There will be (number of groups - 1) linear discriminants; here we have 5 groups, so we get 4 linear discriminants.

Proportion of trace:

The proportion of trace shows the proportion of between-group variance explained by each linear discriminant. In our example, 94.31% of that variance is explained by LD1.
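You can recompute the proportion of trace yourself: the fitted lda object stores the singular values of the analysis in `$svd`, and squaring and normalizing them gives the proportions printed above (a sketch assuming the `camel.lda` object fitted earlier):

```r
# Squared singular values, normalized, reproduce the Proportion of trace
# (assumes camel.lda from the chunk above)
prop.trace <- camel.lda$svd^2 / sum(camel.lda$svd^2)
round(prop.trace, 4)
```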

Now we are interested in how well our model reassigns our samples to our groups.

The predict() function allows us to check how successfully the linear function we just derived classifies our data into groups.

camel.p <- predict(camel.lda)
camel.p
## $class
##   [1] Aepycamelus  Aepycamelus  Aepycamelus  Aepycamelus  Aepycamelus 
##   [6] Aepycamelus  Aepycamelus  Aepycamelus  Aepycamelus  Aepycamelus 
##  [11] Aepycamelus  Aepycamelus  Aepycamelus  Megatylopus  Aepycamelus 
##  [16] Aepycamelus  Aepycamelus  Megatylopus  Aepycamelus  Aepycamelus 
##  [21] Aepycamelus  Aepycamelus  Aepycamelus  Aepycamelus  Aepycamelus 
##  [26] Aepycamelus  Aepycamelus  Aepycamelus  Aepycamelus  Megatylopus 
##  [31] Aepycamelus  Aepycamelus  Aepycamelus  Aepycamelus  Aepycamelus 
##  [36] Aepycamelus  Aepycamelus  Aepycamelus  Aepycamelus  Aepycamelus 
##  [41] Aepycamelus  Aepycamelus  Aepycamelus  Aepycamelus  Aepycamelus 
##  [46] Megatylopus  Aepycamelus  Aepycamelus  Hemiauchenia Hemiauchenia
##  [51] Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
##  [56] Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
##  [61] Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
##  [66] Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
##  [71] Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
##  [76] Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
##  [81] Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
##  [86] Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
##  [91] Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
##  [96] Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
## [101] Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
## [106] Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
## [111] Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
## [116] Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
## [121] Hemiauchenia Hemiauchenia Hemiauchenia Megatylopus  Aepycamelus 
## [126] Megatylopus  Megatylopus  Megatylopus  Megatylopus  Megatylopus 
## [131] Megatylopus  Megatylopus  Megatylopus  Megatylopus  Megatylopus 
## [136] Megatylopus  Megatylopus  Megatylopus  Megatylopus  Megatylopus 
## [141] Megatylopus  Megatylopus  Megatylopus  Megatylopus  Megatylopus 
## [146] Megatylopus  Alforjas     Alforjas     Alforjas     Alforjas    
## [151] Alforjas     Alforjas     Alforjas     Alforjas     Alforjas    
## [156] Alforjas     Alforjas     Alforjas     Alforjas     Alforjas    
## [161] Alforjas     Alforjas     Alforjas     Alforjas     Alforjas    
## [166] Hemiauchenia Alforjas     Alforjas     Alforjas     Alforjas    
## [171] Procamelus   Procamelus   Procamelus   Procamelus   Procamelus  
## [176] Procamelus   Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
## [181] Hemiauchenia
## Levels: Aepycamelus Alforjas Hemiauchenia Megatylopus Procamelus
## 
## $posterior
##      Aepycamelus     Alforjas Hemiauchenia  Megatylopus   Procamelus
## 1   9.999704e-01 4.844915e-30 1.551687e-44 2.961259e-05 9.607793e-19
## 2   9.999878e-01 1.717864e-31 1.021650e-45 1.215902e-05 5.361055e-20
## 3   1.000000e+00 2.538645e-40 3.216243e-56 2.257359e-09 5.636456e-27
## 4   9.999998e-01 2.698264e-38 1.491667e-53 2.349458e-07 1.194647e-27
## 5   9.999840e-01 1.297946e-30 3.752229e-46 1.597350e-05 9.370914e-22
## 6   9.968717e-01 3.407794e-28 4.525622e-41 3.128339e-03 3.615614e-18
## 7   9.993273e-01 3.407916e-23 3.502462e-36 6.727203e-04 1.597663e-13
## 8   9.999995e-01 5.660789e-37 7.123536e-53 4.564793e-07 2.967987e-26
## 9   9.999395e-01 4.621318e-31 1.062022e-46 6.053038e-05 4.068939e-23
## 10  9.718730e-01 8.558465e-21 3.023656e-35 2.812697e-02 5.628379e-14
## 11  9.907861e-01 3.341846e-26 5.817952e-40 9.213877e-03 5.179408e-17
## 12  8.466563e-01 2.200975e-23 2.241242e-34 1.533437e-01 5.821150e-15
## 13  9.214822e-01 3.040378e-22 4.290575e-35 7.851783e-02 1.646814e-12
## 14  3.314466e-02 3.806282e-16 2.838102e-26 9.668553e-01 3.307543e-08
## 15  1.000000e+00 2.998566e-38 1.243793e-52 1.444728e-08 4.211514e-27
## 16  9.999979e-01 4.128674e-32 1.598954e-45 2.070713e-06 2.698287e-19
## 17  9.841773e-01 8.488111e-24 7.440040e-38 1.582267e-02 2.732087e-17
## 18  4.186575e-01 1.217537e-18 3.788419e-31 5.813425e-01 2.718426e-12
## 19  9.998583e-01 2.672689e-27 1.564714e-41 1.417059e-04 8.744445e-18
## 20  9.999839e-01 6.755463e-32 3.118857e-47 1.607494e-05 2.831473e-25
## 21  9.990397e-01 1.493142e-28 3.502684e-43 9.603445e-04 1.276271e-20
## 22  9.995367e-01 5.641398e-28 5.184960e-40 4.633498e-04 2.636280e-18
## 23  9.999826e-01 1.285321e-27 1.784796e-41 1.737729e-05 1.740111e-18
## 24  9.888925e-01 1.267646e-21 1.686873e-33 1.110746e-02 5.214603e-13
## 25  9.795862e-01 3.378327e-25 4.183345e-39 2.041380e-02 5.944626e-17
## 26  9.999955e-01 6.326949e-32 2.086717e-45 4.536273e-06 3.714380e-23
## 27  9.997259e-01 2.545608e-27 7.286249e-40 2.740878e-04 1.978811e-17
## 28  9.980127e-01 3.918939e-26 2.428176e-40 1.987276e-03 5.555670e-15
## 29  9.999984e-01 1.031963e-33 7.263821e-49 1.614898e-06 1.817173e-21
## 30  2.964244e-03 2.481835e-12 2.379289e-21 9.970357e-01 4.003052e-08
## 31  9.997722e-01 8.886177e-29 1.420803e-42 2.277788e-04 3.062552e-18
## 32  9.999932e-01 1.194180e-28 1.090744e-41 6.766740e-06 1.116083e-15
## 33  9.999954e-01 5.747254e-30 1.480114e-42 4.633990e-06 2.205885e-18
## 34  9.999951e-01 1.314488e-34 3.318921e-49 4.882879e-06 2.259321e-23
## 35  9.999933e-01 4.119459e-34 7.918727e-50 6.730331e-06 1.963192e-23
## 36  9.944174e-01 6.549469e-29 5.402559e-44 5.582584e-03 1.205835e-19
## 37  9.999995e-01 5.713730e-39 5.872208e-56 4.600350e-07 5.557007e-27
## 38  9.995681e-01 6.234291e-27 3.705627e-40 4.318577e-04 3.459639e-17
## 39  9.999509e-01 8.315521e-28 1.466432e-39 4.908144e-05 4.241122e-18
## 40  9.998209e-01 4.768070e-28 1.033489e-40 1.791148e-04 1.390129e-18
## 41  9.611418e-01 3.018534e-24 1.186891e-37 3.885816e-02 4.240409e-16
## 42  9.999959e-01 1.513868e-34 1.970481e-48 4.056217e-06 8.840115e-23
## 43  9.999097e-01 3.845152e-30 1.360622e-45 9.033376e-05 1.836124e-20
## 44  9.999812e-01 1.653777e-30 4.484338e-45 1.878450e-05 4.446176e-18
## 45  9.998382e-01 1.035170e-26 1.269699e-38 1.618110e-04 3.996706e-17
## 46  5.643835e-02 1.444382e-13 8.988535e-21 9.435616e-01 1.238173e-08
## 47  9.979631e-01 1.823145e-23 3.759512e-36 2.036872e-03 2.633911e-12
## 48  1.000000e+00 8.700716e-39 4.463512e-52 9.304585e-09 7.517745e-28
## 49  3.158663e-41 1.154398e-02 9.884559e-01 2.860928e-22 9.744667e-08
## 50  7.613699e-43 6.215998e-03 9.937839e-01 8.463408e-23 9.387805e-08
## 51  9.014835e-42 1.285433e-02 9.871456e-01 2.207806e-22 3.711289e-08
## 52  2.300245e-46 2.700665e-03 9.972993e-01 7.256526e-26 1.103074e-08
## 53  1.226134e-36 3.290832e-01 6.707435e-01 3.072346e-19 1.732953e-04
## 54  1.734107e-42 9.665540e-03 9.903343e-01 2.418185e-23 1.361696e-07
## 55  3.119836e-45 2.541181e-03 9.974588e-01 7.885918e-25 3.797379e-08
## 56  1.965234e-37 1.090908e-02 9.890505e-01 2.907317e-20 4.038711e-05
## 57  1.280745e-46 1.399654e-03 9.986003e-01 6.924089e-26 9.133898e-09
## 58  1.292874e-45 4.974239e-04 9.995026e-01 8.188209e-26 1.116458e-08
## 59  2.055651e-48 7.056397e-04 9.992944e-01 3.750725e-27 4.761709e-09
## 60  9.700363e-49 7.178530e-04 9.992821e-01 2.057971e-27 1.553498e-10
## 61  9.332908e-45 9.063085e-05 9.999093e-01 8.462997e-26 3.864146e-08
## 62  3.382356e-44 3.930179e-04 9.996069e-01 1.130156e-24 1.203394e-07
## 63  6.853719e-39 1.346714e-02 9.865319e-01 2.222356e-21 9.626364e-07
## 64  4.019802e-36 2.418813e-02 9.723969e-01 4.297559e-19 3.414996e-03
## 65  2.909333e-43 9.849687e-03 9.901500e-01 7.570491e-24 2.980833e-07
## 66  4.909913e-40 1.260561e-02 9.873940e-01 1.558741e-21 4.371844e-07
## 67  3.328044e-38 9.035879e-02 9.096353e-01 7.198566e-20 5.911532e-06
## 68  4.990087e-40 1.594826e-02 9.840504e-01 1.148170e-21 1.376069e-06
## 69  1.648481e-47 7.549776e-04 9.992450e-01 1.285764e-26 4.341199e-09
## 70  3.185208e-40 7.216042e-03 9.927827e-01 5.216654e-22 1.231133e-06
## 71  2.614839e-41 3.489572e-02 9.651033e-01 2.545465e-22 9.473745e-07
## 72  7.763610e-42 1.123984e-02 9.887599e-01 1.400511e-22 2.684084e-07
## 73  1.922832e-44 3.745614e-03 9.962543e-01 2.512281e-24 6.151467e-08
## 74  1.552502e-39 1.382754e-02 9.861599e-01 1.295530e-21 1.253946e-05
## 75  4.177382e-40 3.839396e-02 9.616029e-01 8.742789e-22 3.125964e-06
## 76  8.000697e-42 2.038261e-02 9.796171e-01 3.036938e-22 3.032869e-07
## 77  9.134725e-49 4.451668e-04 9.995548e-01 1.278139e-27 1.592993e-08
## 78  2.194899e-42 2.839613e-03 9.971595e-01 1.650161e-23 8.951744e-07
## 79  1.534604e-43 2.157400e-03 9.978422e-01 4.059954e-24 3.922163e-07
## 80  6.169540e-44 4.505124e-03 9.954948e-01 4.202270e-24 7.618489e-08
## 81  4.292244e-42 9.659713e-03 9.903402e-01 4.168338e-23 5.072180e-08
## 82  1.786377e-35 7.142673e-02 9.285124e-01 3.820806e-18 6.086283e-05
## 83  3.595879e-35 2.594912e-02 9.740019e-01 2.004186e-18 4.894340e-05
## 84  6.167949e-39 8.641752e-04 9.991351e-01 1.761492e-21 7.041617e-07
## 85  6.370821e-48 9.496445e-04 9.990504e-01 7.772693e-27 1.981942e-09
## 86  9.448946e-45 2.968526e-03 9.970314e-01 8.597890e-25 4.213738e-08
## 87  2.288181e-41 8.930957e-03 9.910689e-01 2.682768e-22 1.059415e-07
## 88  3.016217e-35 2.076962e-01 7.919794e-01 3.986638e-18 3.244463e-04
## 89  3.308794e-41 1.952415e-02 9.804759e-01 3.197554e-22 8.878747e-10
## 90  1.660605e-55 2.938622e-05 9.999706e-01 3.973187e-32 3.350098e-12
## 91  3.351038e-40 3.971185e-02 9.602880e-01 8.600887e-21 1.299389e-07
## 92  7.809404e-47 1.388971e-03 9.986110e-01 5.602638e-26 2.144930e-09
## 93  1.067347e-41 4.082692e-02 9.591730e-01 5.128479e-22 6.472396e-08
## 94  3.495565e-40 6.326476e-03 9.936732e-01 1.486173e-21 2.895273e-07
## 95  2.075851e-40 6.653172e-03 9.933460e-01 1.075570e-21 7.905739e-07
## 96  7.097837e-43 1.296138e-02 9.870386e-01 3.979685e-23 6.505865e-08
## 97  4.380311e-46 5.375687e-04 9.994624e-01 3.472361e-26 1.207663e-08
## 98  3.917714e-46 5.672314e-03 9.943277e-01 7.484719e-26 8.510860e-09
## 99  6.081755e-47 2.872532e-04 9.997127e-01 5.993873e-27 8.175713e-10
## 100 1.629139e-41 1.269903e-02 9.873010e-01 2.955772e-22 1.282685e-08
## 101 3.313216e-40 6.224058e-03 9.937756e-01 1.697332e-21 3.666385e-07
## 102 7.183497e-44 6.599263e-04 9.993401e-01 7.323132e-25 7.626642e-09
## 103 1.027827e-44 8.354272e-04 9.991646e-01 5.774903e-25 7.978117e-09
## 104 5.980932e-42 4.706396e-02 9.529360e-01 3.748933e-22 3.592334e-08
## 105 5.233834e-42 1.025621e-01 8.974379e-01 6.365676e-22 9.872398e-09
## 106 1.252153e-38 2.547841e-02 9.745209e-01 1.622665e-20 6.560682e-07
## 107 8.456719e-42 1.818939e-03 9.981809e-01 4.457180e-23 1.457624e-07
## 108 4.464091e-41 1.309280e-02 9.869067e-01 3.672847e-22 4.679410e-07
## 109 6.613295e-42 4.078998e-03 9.959207e-01 3.415471e-23 2.688247e-07
## 110 7.386150e-40 1.845814e-02 9.815414e-01 2.777932e-21 4.758537e-07
## 111 7.743576e-36 8.973569e-02 9.101683e-01 5.537188e-18 9.601490e-05
## 112 3.179400e-48 9.915161e-04 9.990085e-01 6.793027e-27 3.036605e-10
## 113 1.323714e-40 1.036107e-02 9.896368e-01 1.325311e-22 2.083214e-06
## 114 1.853313e-46 3.601099e-03 9.963989e-01 8.813901e-26 1.134745e-08
## 115 2.360744e-35 1.062618e-01 8.937176e-01 2.017879e-17 2.061084e-05
## 116 1.984319e-31 4.953528e-01 5.044240e-01 8.993629e-15 2.231931e-04
## 117 2.689026e-41 3.543619e-03 9.964564e-01 8.544390e-22 2.169917e-09
## 118 3.294378e-41 1.519489e-03 9.984805e-01 1.531278e-22 2.667304e-08
## 119 9.658341e-36 1.238155e-01 8.761784e-01 1.296609e-17 6.154220e-06
## 120 2.215479e-42 1.160872e-02 9.883908e-01 2.337251e-22 4.320740e-07
## 121 1.402317e-31 3.517850e-01 6.480698e-01 8.494206e-15 1.451924e-04
## 122 5.115513e-33 2.555010e-01 7.444845e-01 2.105657e-16 1.455108e-05
## 123 6.025586e-38 7.499463e-03 9.924994e-01 1.942974e-20 1.118505e-06
## 124 5.388233e-03 7.124665e-17 2.066512e-28 9.946118e-01 5.002372e-10
## 125 5.064916e-01 2.342418e-17 2.984182e-28 4.935084e-01 1.016891e-09
## 126 2.373835e-05 5.709888e-12 2.924187e-21 9.999762e-01 4.834147e-08
## 127 2.001900e-07 1.115401e-06 1.120088e-14 9.998539e-01 1.447848e-04
## 128 2.535479e-02 2.544419e-12 4.253156e-21 9.741963e-01 4.489232e-04
## 129 2.564816e-06 5.085328e-07 2.289625e-14 9.999685e-01 2.841175e-05
## 130 1.224552e-08 4.657967e-06 1.439181e-13 9.991802e-01 8.151517e-04
## 131 1.468865e-03 4.468622e-15 7.264291e-26 9.985311e-01 1.896528e-11
## 132 3.007936e-06 1.462358e-10 3.236933e-20 9.999963e-01 6.702526e-07
## 133 2.286600e-05 6.126402e-13 2.417466e-24 9.999771e-01 5.466475e-08
## 134 2.469074e-03 6.018013e-15 5.411001e-26 9.975309e-01 1.654127e-09
## 135 4.633080e-01 2.125144e-19 3.960439e-33 5.366920e-01 6.832852e-13
## 136 3.629386e-05 3.583129e-11 4.720791e-21 9.999444e-01 1.929279e-05
## 137 8.282796e-06 2.202815e-10 4.000926e-19 9.999912e-01 5.560342e-07
## 138 4.631509e-03 1.325165e-16 2.233547e-29 9.953685e-01 6.206525e-12
## 139 6.158568e-02 4.081173e-19 5.410255e-31 9.384143e-01 6.490373e-12
## 140 1.217931e-05 2.885628e-13 4.264610e-24 9.999878e-01 1.721591e-08
## 141 2.343171e-03 2.020134e-13 3.017892e-23 9.976564e-01 4.078593e-07
## 142 1.138823e-06 7.127442e-09 2.735245e-18 9.999937e-01 5.135563e-06
## 143 3.697504e-06 6.292640e-10 5.859615e-18 9.999957e-01 5.745918e-07
## 144 3.621620e-06 2.660634e-09 3.659842e-16 9.999778e-01 1.858063e-05
## 145 2.546308e-01 8.546845e-17 1.148367e-27 7.453692e-01 3.337056e-10
## 146 3.817344e-03 1.575492e-13 8.380221e-23 9.961825e-01 1.408148e-07
## 147 3.392227e-28 9.585887e-01 1.147467e-02 3.936894e-13 2.993664e-02
## 148 7.720373e-27 9.950458e-01 3.160377e-03 6.542372e-12 1.793798e-03
## 149 3.866181e-26 9.784145e-01 1.346470e-02 1.808914e-11 8.120756e-03
## 150 9.019656e-31 8.997853e-01 9.767832e-02 1.344859e-13 2.536346e-03
## 151 8.918218e-33 8.255261e-01 1.744719e-01 2.727409e-14 2.079869e-06
## 152 4.969479e-32 9.665775e-01 3.321080e-02 7.529640e-15 2.116892e-04
## 153 4.483072e-28 9.898630e-01 1.012559e-02 7.089732e-12 1.145585e-05
## 154 2.925602e-25 9.980096e-01 6.406513e-04 4.110445e-11 1.349740e-03
## 155 5.372361e-28 9.654799e-01 2.434858e-02 7.570110e-13 1.017149e-02
## 156 3.078141e-24 9.503800e-01 4.650927e-04 7.081112e-10 4.915489e-02
## 157 3.712390e-32 9.850819e-01 1.077938e-02 1.331494e-15 4.138678e-03
## 158 5.917755e-26 9.950974e-01 3.467912e-04 2.405039e-11 4.555801e-03
## 159 2.621449e-25 9.384130e-01 2.029116e-03 1.072417e-11 5.955788e-02
## 160 6.262371e-26 9.998301e-01 6.285331e-05 4.176824e-11 1.070031e-04
## 161 2.719705e-28 9.972538e-01 5.991341e-04 2.251328e-13 2.147108e-03
## 162 2.608487e-28 9.678712e-01 2.903900e-02 5.954483e-13 3.089832e-03
## 163 1.293099e-28 9.648031e-01 3.492675e-02 6.719037e-12 2.701021e-04
## 164 1.533602e-29 9.060450e-01 7.507726e-02 2.653080e-14 1.887770e-02
## 165 1.041524e-33 9.897859e-01 9.949296e-03 7.531293e-17 2.647839e-04
## 166 9.772394e-35 2.285891e-01 7.707833e-01 6.860435e-18 6.275310e-04
## 167 2.683994e-34 8.623258e-01 1.376260e-01 4.531440e-17 4.814337e-05
## 168 4.792984e-24 8.772098e-01 1.270115e-02 1.765347e-09 1.100891e-01
## 169 5.443102e-25 9.927115e-01 6.751208e-05 5.529889e-10 7.221032e-03
## 170 4.835442e-21 9.835318e-01 1.432224e-03 3.664513e-08 1.503592e-02
## 171 4.670639e-12 3.141584e-04 3.703818e-08 3.005677e-04 9.993852e-01
## 172 7.636454e-22 2.061364e-02 9.545914e-08 4.044623e-09 9.793863e-01
## 173 1.148094e-21 1.834947e-03 2.630538e-06 3.081541e-10 9.981624e-01
## 174 2.652394e-19 5.216881e-03 5.612678e-07 4.859240e-08 9.947825e-01
## 175 7.091411e-19 5.469252e-03 1.177136e-07 3.207263e-08 9.945306e-01
## 176 1.029773e-17 1.871275e-04 4.271650e-08 1.599252e-07 9.998127e-01
## 177 1.257061e-41 1.204115e-01 8.795871e-01 5.520025e-22 1.412759e-06
## 178 9.178163e-37 1.411299e-01 8.588699e-01 5.505490e-18 1.518752e-07
## 179 3.161769e-44 1.746215e-03 9.982538e-01 6.493048e-24 8.395166e-09
## 180 3.793476e-49 2.112674e-03 9.978873e-01 4.732750e-28 2.968742e-10
## 181 3.587879e-44 4.202734e-02 9.579727e-01 1.513333e-23 6.778392e-09
## 
## $x
##             LD1         LD2          LD3          LD4
## 1    7.88316567 -1.34812481 -0.164964364 -0.517977536
## 2    8.12247147 -1.07552470  0.433549135 -0.003859773
## 3    9.85141321 -1.27768507  1.233567419 -1.242550331
## 4    9.56070722  1.29199756  1.290937966  0.434597623
## 5    8.16437910  0.68284560 -0.354258972 -2.396851848
## 6    7.48188475  0.24652248  0.524455516  2.542795432
## 7    6.41697810 -1.57102297  0.370360798 -2.578869419
## 8    9.37803535  0.70466735  0.295903014 -0.275329852
## 9    8.33149107  1.85964741 -0.486511040 -1.413293315
## 10   6.26672407  0.36607595 -1.543698870 -3.242438469
## 11   7.22189570  0.23246544 -0.640230366  1.152497080
## 12   6.40850050  0.97246939  1.420831827  2.919448539
## 13   6.33914126 -1.14475707 -0.814352294  1.008768745
## 14   4.67295336 -0.74290281  0.288558381  1.428234049
## 15   9.35420475  0.55982498  2.754002027 -1.259064617
## 16   8.07187824 -2.13039005  1.416877812  0.003627276
## 17   6.83736566  1.74337044 -0.583747742 -1.520775540
## 18   5.64471541  0.94297293 -0.611189529 -0.776095522
## 19   7.37272801 -0.45957287 -0.066583874 -1.800737222
## 20   8.46300384  3.18496203  0.619974733 -2.405188431
## 21   7.77541357  1.63760454 -0.382131316 -0.229268064
## 22   7.29088469  0.34987775  1.955740967  1.061290235
## 23   7.34213606 -0.27093477  0.989518852 -3.346290920
## 24   6.07054679 -0.49132461  0.961668950 -1.114696182
## 25   7.07616175  0.79854287 -0.835274770  0.600838848
## 26   8.16398633  1.50147107  2.268542666 -1.387176960
## 27   7.19219528 -0.29087829  1.427626212 -0.051362390
## 28   7.16590786 -1.89017293 -1.282977842  0.430590630
## 29   8.61324093 -1.47003277 -0.009429692 -0.355044466
## 30   3.69709270  1.42789278  1.378767435 -1.023137041
## 31   7.61904031 -0.64026788  0.136609210  0.396451095
## 32   7.33997379 -3.26335417  1.105096804 -1.148875736
## 33   7.58477804 -1.51010472  2.197247987 -0.922970914
## 34   8.79317060  0.06765260  0.671272214  1.089290809
## 35   8.84257420  0.17838831 -0.470630951  0.097513048
## 36   7.92210910  0.78246578 -1.634300376  1.906761450
## 37   9.88848434  0.07370540 -0.989898867  1.110536232
## 38   7.20739104 -0.38311361  0.602509967 -0.403880484
## 39   7.15501922 -0.05983013  2.782998361 -0.814012889
## 40   7.35529106  0.25621323  1.603691941 -0.170637908
## 41   6.84064934  0.75588876 -0.549695322  0.737495599
## 42   8.67943309 -0.29532294  1.304708717  1.551019017
## 43   8.08776143  0.09337880 -0.975246319 -0.801203737
## 44   7.94733041 -2.27140754 -0.396893421 -0.090554976
## 45   6.98396859 -0.19236152  2.240505873 -0.712953286
## 46   3.90324823  1.61997024  4.076872813 -0.423030648
## 47   6.43509515 -2.53426774  0.018225172 -0.553807622
## 48   9.33167673  1.11868852  4.118945632 -0.715727651
## 49  -6.12642848  0.89240305  1.004835684 -0.232828681
## 50  -6.36978790  0.65359640  0.450175591  1.248146439
## 51  -6.20284658  1.24333470  0.776802414 -0.040997936
## 52  -7.02397477  0.24760688  0.399488374 -0.153959347
## 53  -5.43758817 -0.51102086  0.027969621 -1.207265395
## 54  -6.37427499  0.34839073  0.792321757 -0.553402924
## 55  -6.80629190  0.14560400  0.444791172  0.759540892
## 56  -5.52235943 -0.72358314  1.712045515  0.186521461
## 57  -7.03946296  0.23434439  0.509450127  0.596038932
## 58  -6.86627634  0.03719328  1.514328211  0.294311384
## 59  -7.34947829 -0.05957433  0.356208018  0.760881669
## 60  -7.37693108  1.16402026  0.696638459 -0.050557581
## 61  -6.70874378 -0.66445349  2.577755416  0.786050224
## 62  -6.60870085 -0.49073922  1.619800330  1.416365408
## 63  -5.76855627  0.33348017  1.740633836 -1.096362495
## 64  -5.32911937 -1.90229463  1.080048968  1.127357251
## 65  -6.52586487 -0.13212860  0.452738610 -0.422253621
## 66  -5.92829648  0.58983226  1.173668585 -0.051645497
## 67  -5.64035663  0.43981512  0.363999440 -0.080265662
## 68  -5.95543138  0.12642484  1.024480552 -0.272632388
## 69  -7.18772380  0.17555220  0.648509483  0.600517638
## 70  -5.98204436 -0.05892031  1.404884002 -0.117243158
## 71  -6.19698673  0.13612097  0.273177916 -0.620912268
## 72  -6.24267771  0.39008768  0.655579790  0.154535459
## 73  -6.67816125  0.18951888  0.486423300  0.552657888
## 74  -5.90258821 -0.72985809  1.113428765 -0.237851115
## 75  -6.00782295 -0.12544740  0.598775583 -0.922231223
## 76  -6.23610060  0.54710056  0.228273246  0.372182986
## 77  -7.43000943 -0.75686919  0.411388692  0.879701296
## 78  -6.35570641 -0.59554473  1.190746886  0.310234996
## 79  -6.54170316 -0.52780607  0.960782470  0.616558292
## 80  -6.59994795  0.20784909  0.605854957  0.252610851
## 81  -6.29264584  0.81274360  1.044689913 -0.785797950
## 82  -5.16031633  0.13463963  1.038912977  0.526315313
## 83  -5.10751561 -0.01826406  1.835392575  0.325951194
## 84  -5.69818412  0.07129022  2.844589364  0.807207780
## 85  -7.25869074  0.42871902  0.503278693  0.357226724
## 86  -6.74235555  0.13095668  0.678563308  0.122043520
## 87  -6.14130228  0.81993563  1.013338239  0.104778055
## 88  -5.16866977 -0.38744445  0.504503146 -0.245925018
## 89  -6.08538349  2.78331030  1.363977984 -1.523125820
## 90  -8.50886298  0.53283480  0.482828209  0.880000116
## 91  -5.91985544  1.48864094  0.338987526  0.500860827
## 92  -7.06007121  0.75894857  0.596500299  0.398481137
## 93  -6.20598994  1.31247356  0.100542736 -0.187360009
## 94  -5.92586825  0.65690568  1.397989920  0.541963565
## 95  -5.97921767  0.22645940  1.183083071  0.717101005
## 96  -6.41403006  0.75994848  0.383081086 -0.004396699
## 97  -6.96051861 -0.11917730  1.370427634  0.109707771
## 98  -7.00982285  0.44145177  0.299971565 -1.044910815
## 99  -7.08534260  0.55757723  1.791731376 -0.380101354
## 100 -6.14667113  1.69497387  1.013056904 -0.323669784
## 101 -5.92717211  0.58681639  1.326355704  0.746928413
## 102 -6.56203601  0.55664339  2.092424762 -0.334239038
## 103 -6.69416505  0.54421386  1.484104883  0.366231757
## 104 -6.24799166  1.50797759  0.023513902 -0.367437637
## 105 -6.24644325  2.18707105 -0.310186290 -0.711718645
## 106 -5.68675778  0.88989018  1.223783997 -0.283730194
## 107 -6.21200317  0.20340718  1.730508110  0.393579693
## 108 -6.11836918  0.34387084  0.804546016  0.013380459
## 109 -6.26505943  0.02643238  1.344751595 -0.195507223
## 110 -5.89816969  0.69867887  0.993380472 -0.055726153
## 111 -5.21035839  0.05502458  0.549239650  1.222462600
## 112 -7.28606268  1.13851516  0.545122011  0.211364406
## 113 -6.09224227 -0.43982643  1.286017870 -0.985115120
## 114 -7.03997026  0.30724311  0.175166189 -0.081530390
## 115 -5.09195686  0.87531275  0.671575636  1.274196738
## 116 -4.40499219  1.05223273  0.684545843  0.692586375
## 117 -6.02851565  2.39359563  1.597577683  0.935109990
## 118 -6.06947867  1.03119199  2.099015138  0.498409926
## 119 -5.14989064  1.29008249  0.599731516  1.019368713
## 120 -6.30920528  0.30701685  0.087527686  1.363497999
## 121 -4.42561680  1.19234186  0.891317236  1.074324525
## 122 -4.70950811  1.45421708  1.302040192 -0.470985388
## 123 -5.55686261  0.55065933  2.045191021  0.052414814
## 124  4.90531106  0.19331661 -1.341693142  1.640762870
## 125  5.16593914 -0.33691291  0.670941920 -0.390829361
## 126  3.36826645  1.51874847 -0.288340625  1.105806703
## 127  1.87316485  1.50080551 -0.389374368 -0.110739662
## 128  3.69704809 -2.11954222  1.130862073 -0.207284260
## 129  2.03199503  2.07879960  1.181153014 -1.070805644
## 130  1.51529511  1.28884727 -0.805129265  1.340989384
## 131  4.44498818  2.56033838 -0.226230504  0.014609343
## 132  2.99886316  1.16797506 -1.349548031  0.909072510
## 133  3.78720966  0.49259765 -2.691023930  1.059398264
## 134  4.43483180  0.84571192 -0.864741353  0.331627862
## 135  5.94381719  0.79160708 -1.802449516 -1.002549928
## 136  3.25195600 -0.51029053 -1.369445660  0.801807957
## 137  2.92368019  1.53083687 -0.042779686  0.634741377
## 138  5.03638938  1.74225678 -2.091282057 -0.208898638
## 139  5.54818895  0.60142306 -0.829563851  1.805165839
## 140  3.75509347  0.90313079 -2.195044513  1.895110647
## 141  3.95163824 -0.20830403 -0.009960962  0.393528223
## 142  2.58085829  1.31767845 -1.336535952  0.008343446
## 143  2.70871190  1.92065408  0.536803175  0.965499573
## 144  2.42017489  1.17178194  1.436668866  1.629089132
## 145  5.04263916  0.46189937  0.625276818 -0.530577579
## 146  3.94844410  0.26592697  0.820661404  0.355791054
## 147 -3.72324547 -0.54358885 -0.625233533 -0.640412321
## 148 -3.35653392  0.86557455 -0.659180643 -0.873954069
## 149 -3.31708512  0.58375319  0.040484987 -0.199965060
## 150 -4.19349312  0.47432706 -0.809581595  1.808510084
## 151 -4.47140501  3.02167823 -0.608371209  1.756906798
## 152 -4.37649829  0.86993279 -1.157097359  0.091375953
## 153 -3.53084183  2.95926116 -0.370822838  0.044507888
## 154 -2.99859137  1.09919941 -0.735515400 -1.766778474
## 155 -3.70240869  0.05303549 -0.182580056 -0.364447554
## 156 -2.79184625  0.13935047 -1.318194035  0.074368511
## 157 -4.42161319 -0.67137238 -1.691389106 -0.938892960
## 158 -3.09329972  0.49280617 -1.498771890 -1.250435353
## 159 -3.14183530 -0.47716872 -0.350209899 -1.755642068
## 160 -2.94776113  1.86082555 -1.936738005 -2.005872901
## 161 -3.57809127  0.09212330 -1.581393388 -2.317188639
## 162 -3.74953124  0.49425151 -0.111107904 -0.370867004
## 163 -3.70274027  1.86291087 -0.539777811  1.664116866
## 164 -4.07901229 -0.61825614  0.063496551 -0.826850908
## 165 -4.69051373 -0.07663725 -1.730266016 -2.022251734
## 166 -5.08981951 -0.54965095  0.586655596 -0.374799128
## 167 -4.89751612  0.74536799 -0.552606455 -1.282892623
## 168 -2.91014453  0.29482004  0.014559338  1.683803031
## 169 -2.78792635  0.66467857 -2.423681050 -0.093212568
## 170 -2.27362325  1.31661000  0.508664222 -0.576998029
## 171 -0.02925292 -0.99453309  2.088430196  0.083646695
## 172 -1.89840140 -2.07877031 -2.899651672 -0.087652985
## 173 -2.04622417 -2.50060599  0.273229568  0.165350348
## 174 -1.51885777 -1.71066058 -0.477310769  0.324765165
## 175 -1.39470835 -1.93594998 -0.809185404 -1.012129723
## 176 -1.03438684 -2.21275187  0.388501018  1.224650424
## 177 -6.25426092  0.26510233 -0.692733228 -0.438930680
## 178 -5.28131975  2.60280682  0.513933568  0.725434001
## 179 -6.57607519  1.01577908  0.927684690  1.277713511
## 180 -7.51634005  0.81047649  0.300641196 -1.495966949
## 181 -6.63260550  1.63554363 -0.435228422 -0.555221595

Class:

The first element of the output shows the group to which the model assigned each sample in our original data set
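One quick way to summarize these reassignments is a confusion matrix of actual versus predicted genus. This is a sketch assuming the `camels` and `camel.p` objects from above (and that na.omit didn't drop any rows, so the two vectors line up):

```r
# Confusion matrix: rows are the known genera, columns the lda assignments
# (assumes camels and camel.p from above; check that na.omit did not drop
# any rows first, so the two vectors are the same length)
confusion <- table(Actual = camels$Genus..2., Predicted = camel.p$class)
confusion
# Overall reclassification rate
sum(diag(confusion)) / sum(confusion)
```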

Posterior:

The posterior probabilities show the probability of each sample belonging to each group. These probabilities measure the strength of each classification: if one is much greater than the others, that sample is assigned to its group with a high degree of confidence. Here, many of these probabilities are close to 1, so most of these data are confidently assigned to groups.
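To pull out just the winning probability for each specimen, take the row maximum of the posterior matrix (a sketch assuming `camel.p` from above):

```r
# Highest posterior probability per specimen (assumes camel.p from above)
best.post <- apply(camel.p$posterior, 1, max)
summary(best.post)
# Specimens whose best assignment falls below, say, 0.95 support
which(best.post < 0.95)
```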

x:

This last element shows the discriminant function axis scores for each sample. These scores can be plotted to show the distribution of our samples.
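Before reaching for any plotting packages, a quick base-R scatter of the first two discriminant axes (assuming `camel.p` and `camels` from above) already shows the group separation:

```r
# LD1 vs LD2, colored by genus (assumes camel.p and camels from above)
genus <- factor(camels$Genus..2.)
plot(camel.p$x[, 1], camel.p$x[, 2],
     col = as.integer(genus), pch = 19,
     xlab = "LD1", ylab = "LD2")
legend("topright", legend = levels(genus),
       col = seq_along(levels(genus)), pch = 19, cex = 0.7)
```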

There are a number of ways to visualize these LD# scores

A simple way to do this is to use the ggord() function in the {ggord} package to plot our model. Super easy to use. It’s a spin-off of ggplot2, and we all love ggplot2, so you might like this too.

library(ggplot2)
library(devtools)
install_github('fawda123/ggord')
## Skipping install of 'ggord' from a github remote, the SHA1 (c92d629d) has not changed since last install.
##   Use `force = TRUE` to force installation
#check out https://github.com/fawda123/ggord/blob/master/R/ggord.R to view original code for this package
library(ggord)

At its base level, ggord() is pretty good. Check it out if you want to learn more.

ggord(camel.lda,camels$Genus..2.)

But our data points are a little big, and it’s difficult to tell how our variables are influencing our data, so we’ll mess around with it a little bit.
The ggord() defaults are a little particular, so if you want to use this package to make your own plots, you should check out the raw code for the default ggord() plot to see the easiest way to customize it.

p <- ggord(camel.lda, camels$Genus..2., txt=3, vec_ext=6, size=1) #adjusting text size, vector length, and size of the points
p + ggtitle("DFA of Davis & McHorse 2013 Camelid data") + #adding a title
  scale_color_brewer(palette="Dark2") + #changing the palette
  theme_minimal() + #adding a minimal theme because I like it more
  theme(plot.title = element_text(lineheight=.8, face="bold")) #changing the title text; this comes after theme_minimal() so the complete theme doesn't override it

How does this compare to a PCA plot?

library(FactoMineR) #add the package with PCA() to our library
oo <- PCA(camels[, 8:15], graph = FALSE) #running our PCA model
pca.<-ggord(oo, camels$Genus..2., txt=3, vec_ext=3, size=1 ) #plot our principal components as we did with our linear discriminants
pca. + ggtitle("PCA of Davis & McHorse Camelid data") +
  scale_color_brewer(palette="Dark2") +
  theme_minimal() + #complete theme first, so it doesn't override the title styling below
  theme(plot.title = element_text(lineheight=.8, face="bold"))

Another nice way to visualize these data is by using the {plotly} package.

Allows us to look at 3 LD scores at the same time
The plotly package rocks and has very cool interactive visualizations

library(plotly)
## 
## Attaching package: 'plotly'
## The following object is masked from 'package:ggplot2':
## 
##     last_plot
## The following object is masked from 'package:MASS':
## 
##     select
## The following object is masked from 'package:stats':
## 
##     filter
## The following object is masked from 'package:graphics':
## 
##     layout
p<-plot_ly(camels, x=camel.p$x[,1], y=camel.p$x[,2], z=camel.p$x[,3]) %>% #indicate what your axes are
  add_markers ( #add markers and a hover label
    text=~paste(paste("LD1:",camel.p$x[,1]), paste("LD2:",camel.p$x[,2]), paste("LD3:",camel.p$x[,3]) , paste("Genus:", camels$Genus..2.), sep="<br />"), #what each element in your hover label will be and how you will separate them (if sep= "<br />" wasn't here they would all print on one long hover label)
    color=~camels$Genus..2., colors="Set1", symbol=I("circle"), size=I(6), hoverinfo="text") %>% #how points are defined (by genus), what colors to use, the symbol and symbol size of each point, and what info is shown in the hover label (here, what we just defined as "text")
  layout(scene=list(xaxis=list(title="LD1"),
                    yaxis=list(title="LD2"),
                    zaxis=list(title="LD3"))) #axes labels
p

We have it plotted, what now?


We are now interested in how accurately our discriminant function analysis classified our samples into our groups.
To assess this, we need to compare our predicted groups with the actual groups.

cam <- table(camels$Genus..2., camel.p$class) #make a table of our original group assignments and predicted group assignments
cam #here the rows are original group assignments and the columns are predicted group assignments
##               
##                Aepycamelus Alforjas Hemiauchenia Megatylopus Procamelus
##   Aepycamelus           44        0            0           4          0
##   Alforjas               0       23            1           0          0
##   Hemiauchenia           0        0           80           0          0
##   Megatylopus            1        0            0          22          0
##   Procamelus             0        0            0           0          6

From here we can quantify the overall predictive accuracy of our LDA model.

sum(cam[row(cam) == col(cam)]) / sum(cam) #summing the number of samples that were classified in the prediction as in our original data divided by the total number of samples
## [1] 0.9668508
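Since the correct classifications sit on the diagonal of the table, diag() gives the same number a bit more readably; an equivalent sketch (on the built-in iris data so it runs standalone):

```r
library(MASS)
# cross-tabulate actual vs predicted groups from a resubstituted lda fit
tab <- table(actual = iris$Species,
             predicted = predict(lda(Species ~ ., data = iris))$class)
sum(diag(tab)) / sum(tab)  # overall accuracy: correct classifications / total
```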

Since we'll be doing this again, it will be useful to have a function that generates a confusion matrix and calculates overall accuracy.
I didn't write this function myself, only modified it slightly, but I like the output it generates.
It makes it easy to report the accuracy of your model (as we did above, but in one line).

#this is writing a function that gives us a table that shows our predicted vs actual
confusion <- function(actual, predicted, names = NULL, printit = TRUE, prior = NULL) { #names and prior are NULL unless otherwise specified; printit=TRUE prints the output
  if (is.null(names))
    names <- levels(actual)
  tab <- table(actual, predicted)
  acctab <- t(apply(tab, 1, function(x) x/sum(x))) #row-wise proportions
  dimnames(acctab) <- list(Actual = names, "Predicted" = names)
  if (is.null(prior)) {
    relnum <- table(actual)
    prior <- relnum/sum(relnum) #default prior: relative abundances in the data
    acc <- sum(tab[row(tab) == col(tab)])/sum(tab)
  } else {
    acc <- sum(prior * diag(acctab))
    names(prior) <- names
  }
  if (printit)
    print(round(c("Overall accuracy" = acc, "Prior frequency" = prior), 4))
  if (printit) {
    cat("\nConfusion matrix", "\n")
    print(round(acctab, 4))
  }
  invisible(acctab)
}
confusion(camels$Genus..2., camel.p$class,prior= c(1,1,1,1,1)/5) #generating our confusion matrix for our camel lda model
##             Overall accuracy  Prior frequency.Aepycamelus 
##                       0.9663                       0.2000 
##     Prior frequency.Alforjas Prior frequency.Hemiauchenia 
##                       0.2000                       0.2000 
##  Prior frequency.Megatylopus   Prior frequency.Procamelus 
##                       0.2000                       0.2000 
## 
## Confusion matrix 
##               Predicted
## Actual         Aepycamelus Alforjas Hemiauchenia Megatylopus Procamelus
##   Aepycamelus       0.9167   0.0000       0.0000      0.0833          0
##   Alforjas          0.0000   0.9583       0.0417      0.0000          0
##   Hemiauchenia      0.0000   0.0000       1.0000      0.0000          0
##   Megatylopus       0.0435   0.0000       0.0000      0.9565          0
##   Procamelus        0.0000   0.0000       0.0000      0.0000          1

What we just did used the resubstitution error method. As a reminder, that measures how well the samples are classified when all of the samples are used to generate the model.

But what if we want to use jackknifing?

Reminder: jackknifed validation (or leave-one-out cross-validation) excludes one of our samples, generates a discriminant function with the remaining samples, uses this discriminant function to reclassify our excluded sample, and repeats this for each sample in the data set.
This can be done simply by indicating CV=TRUE in our lda() call.

knifed_camel <- lda(Genus..2. ~ LM + TD + TI + TP + LL + WD + WI + LI, data=camels, prior= c(1,1,1,1,1)/5, na.action="na.omit", CV=TRUE) #CV=TRUE means we're going to jackknife the cuss out of this thing
knifed_camel # show results
## $class
##   [1] Aepycamelus  Aepycamelus  Aepycamelus  Aepycamelus  Aepycamelus 
##   [6] Aepycamelus  Aepycamelus  Aepycamelus  Aepycamelus  Aepycamelus 
##  [11] Aepycamelus  Aepycamelus  Aepycamelus  Megatylopus  Aepycamelus 
##  [16] Aepycamelus  Aepycamelus  Megatylopus  Aepycamelus  Aepycamelus 
##  [21] Aepycamelus  Aepycamelus  Aepycamelus  Aepycamelus  Aepycamelus 
##  [26] Aepycamelus  Aepycamelus  Aepycamelus  Aepycamelus  Megatylopus 
##  [31] Aepycamelus  Aepycamelus  Aepycamelus  Aepycamelus  Aepycamelus 
##  [36] Aepycamelus  Aepycamelus  Aepycamelus  Aepycamelus  Aepycamelus 
##  [41] Aepycamelus  Aepycamelus  Aepycamelus  Aepycamelus  Aepycamelus 
##  [46] Megatylopus  Aepycamelus  Aepycamelus  Hemiauchenia Hemiauchenia
##  [51] Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
##  [56] Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
##  [61] Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
##  [66] Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
##  [71] Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
##  [76] Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
##  [81] Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
##  [86] Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
##  [91] Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
##  [96] Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
## [101] Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
## [106] Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
## [111] Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
## [116] Alforjas     Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
## [121] Hemiauchenia Hemiauchenia Hemiauchenia Megatylopus  Aepycamelus 
## [126] Megatylopus  Megatylopus  Megatylopus  Megatylopus  Megatylopus 
## [131] Megatylopus  Megatylopus  Megatylopus  Megatylopus  Aepycamelus 
## [136] Megatylopus  Megatylopus  Megatylopus  Megatylopus  Megatylopus 
## [141] Megatylopus  Megatylopus  Megatylopus  Megatylopus  Megatylopus 
## [146] Megatylopus  Alforjas     Alforjas     Alforjas     Alforjas    
## [151] Alforjas     Alforjas     Alforjas     Alforjas     Alforjas    
## [156] Alforjas     Alforjas     Alforjas     Alforjas     Alforjas    
## [161] Alforjas     Alforjas     Alforjas     Alforjas     Alforjas    
## [166] Hemiauchenia Alforjas     Alforjas     Alforjas     Alforjas    
## [171] Procamelus   Procamelus   Procamelus   Procamelus   Procamelus  
## [176] Procamelus   Hemiauchenia Hemiauchenia Hemiauchenia Hemiauchenia
## [181] Hemiauchenia
## Levels: Aepycamelus Alforjas Hemiauchenia Megatylopus Procamelus
## 
## $posterior
##      Aepycamelus     Alforjas Hemiauchenia  Megatylopus   Procamelus
## 1   9.999649e-01 6.559234e-30 2.044637e-44 3.510405e-05 1.533861e-18
## 2   9.999865e-01 1.601968e-31 9.423372e-46 1.354178e-05 6.903964e-20
## 3   1.000000e+00 3.111942e-42 9.947166e-59 7.555982e-10 4.875788e-28
## 4   9.999998e-01 2.351039e-39 6.745386e-55 1.811409e-07 1.417319e-28
## 5   9.999825e-01 1.638383e-30 3.921160e-46 1.745387e-05 9.073691e-22
## 6   9.952176e-01 6.255639e-28 1.080748e-40 4.782434e-03 6.111753e-18
## 7   9.985970e-01 2.641379e-22 3.497555e-35 1.403047e-03 1.089304e-12
## 8   9.999996e-01 5.718942e-38 2.517187e-54 3.529948e-07 4.308590e-27
## 9   9.999252e-01 5.048689e-31 8.054680e-47 7.478916e-05 2.504907e-23
## 10  8.605713e-01 2.662432e-19 9.393703e-34 1.394287e-01 7.753889e-13
## 11  9.854172e-01 7.450862e-26 1.528825e-39 1.458280e-02 9.800921e-17
## 12  6.279045e-01 9.710716e-23 1.390348e-33 3.720955e-01 1.922291e-14
## 13  8.786958e-01 6.069400e-22 9.629517e-35 1.213042e-01 2.885461e-12
## 14  1.055598e-02 2.531958e-16 8.329366e-27 9.894440e-01 2.753146e-08
## 15  1.000000e+00 2.199891e-39 7.251675e-54 7.719679e-09 5.387779e-28
## 16  9.999981e-01 3.259343e-32 1.579617e-45 1.944653e-06 3.998732e-19
## 17  9.463178e-01 7.426431e-23 7.025875e-37 5.368218e-02 9.321648e-17
## 18  2.716876e-01 1.319141e-18 3.991644e-31 7.283124e-01 3.879389e-12
## 19  9.998340e-01 4.525974e-27 3.076968e-41 1.660392e-04 1.297060e-17
## 20  9.999797e-01 3.631591e-32 7.752128e-48 2.029205e-05 2.406217e-26
## 21  9.987303e-01 2.474663e-28 6.093592e-43 1.269662e-03 1.475981e-20
## 22  9.993455e-01 1.079569e-27 1.399204e-39 6.545030e-04 4.437397e-18
## 23  9.999794e-01 3.111583e-27 4.860840e-41 2.058051e-05 3.298095e-18
## 24  9.800802e-01 3.659507e-21 5.114399e-33 1.991980e-02 1.369543e-12
## 25  9.736372e-01 5.901171e-25 8.788819e-39 2.636280e-02 9.195154e-17
## 26  9.999954e-01 5.049984e-32 2.045431e-45 4.594977e-06 1.944672e-23
## 27  9.996607e-01 4.548074e-27 1.615950e-39 3.393081e-04 3.147849e-17
## 28  9.963244e-01 1.197172e-25 8.018213e-40 3.675579e-03 1.787726e-14
## 29  9.999984e-01 7.350628e-34 4.082050e-49 1.560735e-06 1.889307e-21
## 30  1.962757e-04 2.624846e-12 6.756100e-22 9.998036e-01 8.236873e-08
## 31  9.997305e-01 1.383073e-28 2.594164e-42 2.695110e-04 4.432794e-18
## 32  9.999928e-01 2.270784e-28 2.787629e-41 7.167161e-06 3.324845e-15
## 33  9.999953e-01 7.851126e-30 3.060721e-42 4.700104e-06 4.077861e-18
## 34  9.999950e-01 4.669367e-35 9.494146e-50 4.989828e-06 1.202920e-23
## 35  9.999930e-01 2.582847e-34 3.175251e-50 7.019976e-06 1.379529e-23
## 36  9.857106e-01 1.258967e-28 7.307088e-44 1.428942e-02 1.778809e-19
## 37  9.999997e-01 1.377741e-40 1.783409e-58 3.409720e-07 4.309703e-28
## 38  9.995190e-01 9.441701e-27 6.643192e-40 4.809572e-04 4.610076e-17
## 39  9.999327e-01 2.069951e-27 6.422381e-39 6.734888e-05 9.359501e-18
## 40  9.997913e-01 7.700770e-28 2.055194e-40 2.087124e-04 1.956131e-18
## 41  9.377098e-01 6.951873e-24 3.263465e-37 6.229019e-02 8.273463e-16
## 42  9.999959e-01 5.145546e-35 7.090605e-49 4.067528e-06 5.442337e-23
## 43  9.998949e-01 5.222066e-30 1.641257e-45 1.051481e-04 2.196114e-20
## 44  9.999774e-01 1.902796e-30 4.419875e-45 2.262191e-05 9.482793e-18
## 45  9.997895e-01 2.194354e-26 3.507219e-38 2.104958e-04 7.406871e-17
## 46  1.250130e-03 3.094000e-12 1.218644e-19 9.987497e-01 1.321781e-07
## 47  9.941204e-01 2.130953e-22 6.274428e-35 5.879611e-03 3.595492e-11
## 48  1.000000e+00 4.481036e-41 3.427577e-54 2.309125e-09 8.927695e-30
## 49  6.142989e-41 1.308215e-02 9.869177e-01 4.273496e-22 1.137098e-07
## 50  1.297522e-42 6.577986e-03 9.934219e-01 1.160556e-22 1.051949e-07
## 51  1.578705e-41 1.352441e-02 9.864755e-01 3.012255e-22 4.088187e-08
## 52  2.780687e-46 2.817609e-03 9.971824e-01 8.130208e-26 1.190240e-08
## 53  1.965210e-36 3.697521e-01 6.300435e-01 4.351182e-19 2.044398e-04
## 54  2.997042e-42 1.018691e-02 9.898129e-01 3.249810e-23 1.522030e-07
## 55  4.181927e-45 2.672711e-03 9.973272e-01 9.518124e-25 4.213160e-08
## 56  3.526014e-37 1.214699e-02 9.878026e-01 4.426991e-20 5.044302e-05
## 57  1.338668e-46 1.454637e-03 9.985454e-01 7.274738e-26 9.730249e-09
## 58  1.699850e-45 5.035376e-04 9.994965e-01 9.077728e-26 1.202987e-08
## 59  1.544376e-48 7.195324e-04 9.992805e-01 3.337898e-27 4.984014e-09
## 60  7.005641e-49 7.327739e-04 9.992672e-01 1.793520e-27 1.418195e-10
## 61  1.186075e-44 8.129007e-05 9.999187e-01 8.050018e-26 4.364851e-08
## 62  5.141878e-44 3.937391e-04 9.996061e-01 1.385220e-24 1.365237e-07
## 63  1.242688e-38 1.477581e-02 9.852230e-01 3.238750e-21 1.141311e-06
## 64  1.157551e-35 3.116081e-02 9.625100e-01 9.513067e-19 6.329225e-03
## 65  4.866626e-43 1.033798e-02 9.896617e-01 9.974148e-24 3.327662e-07
## 66  8.396337e-40 1.330429e-02 9.866952e-01 2.118654e-21 4.909151e-07
## 67  4.769853e-38 9.282854e-02 9.071651e-01 8.693389e-20 6.340976e-06
## 68  8.179224e-40 1.652169e-02 9.834768e-01 1.518516e-21 1.496661e-06
## 69  1.609531e-47 7.720244e-04 9.992280e-01 1.297708e-26 4.568309e-09
## 70  6.402485e-40 8.025111e-03 9.919734e-01 7.869148e-22 1.538001e-06
## 71  4.720483e-41 3.787178e-02 9.621271e-01 3.564320e-22 1.094171e-06
## 72  1.353586e-41 1.177026e-02 9.882294e-01 1.901059e-22 2.988081e-07
## 73  2.896089e-44 3.943328e-03 9.960566e-01 3.198868e-24 6.858362e-08
## 74  2.663125e-39 1.474785e-02 9.852382e-01 1.794507e-21 1.397859e-05
## 75  7.645206e-40 4.212704e-02 9.578693e-01 1.247274e-21 3.663951e-06
## 76  1.389826e-41 2.122334e-02 9.787763e-01 4.098904e-22 3.360777e-07
## 77  5.364504e-49 4.444428e-04 9.995555e-01 9.412220e-28 1.725825e-08
## 78  3.832643e-42 2.989592e-03 9.970094e-01 2.209071e-23 1.032296e-06
## 79  2.502760e-43 2.252107e-03 9.977474e-01 5.245558e-24 4.454156e-07
## 80  9.896051e-44 4.711143e-03 9.952888e-01 5.469209e-24 8.453924e-08
## 81  7.606682e-42 1.040415e-02 9.895958e-01 5.702388e-23 5.665090e-08
## 82  1.846017e-35 7.516488e-02 9.247712e-01 4.066921e-18 6.394040e-05
## 83  4.333890e-35 2.859147e-02 9.713515e-01 2.552754e-18 5.699557e-05
## 84  1.315461e-38 8.849666e-04 9.991141e-01 2.807711e-21 8.893939e-07
## 85  5.626189e-48 9.757016e-04 9.990243e-01 7.517710e-27 2.028603e-09
## 86  1.450388e-44 3.073345e-03 9.969266e-01 1.087836e-24 4.627869e-08
## 87  4.060630e-41 9.468622e-03 9.905313e-01 3.702630e-22 1.187464e-07
## 88  3.292320e-35 2.218991e-01 7.777610e-01 4.536646e-18 3.398893e-04
## 89  7.586970e-41 2.580686e-02 9.741931e-01 5.535139e-22 7.027699e-10
## 90  5.477194e-57 2.474456e-05 9.999753e-01 6.632280e-33 1.937257e-12
## 91  5.920235e-40 4.258510e-02 9.574148e-01 1.173927e-20 1.463094e-07
## 92  8.826367e-47 1.435496e-03 9.985645e-01 6.224523e-26 2.221231e-09
## 93  1.873821e-41 4.303119e-02 9.569687e-01 7.010762e-22 7.179417e-08
## 94  6.000030e-40 6.649987e-03 9.933497e-01 2.017600e-21 3.246374e-07
## 95  3.509402e-40 6.936465e-03 9.930627e-01 1.437038e-21 8.720628e-07
## 96  1.209869e-42 1.346792e-02 9.865320e-01 5.348575e-23 7.173154e-08
## 97  5.032157e-46 5.427683e-04 9.994572e-01 3.490442e-26 1.298502e-08
## 98  4.483221e-46 6.083933e-03 9.939161e-01 7.916145e-26 9.045463e-09
## 99  6.393982e-47 2.860753e-04 9.997139e-01 5.635008e-27 8.059173e-10
## 100 2.884800e-41 1.348084e-02 9.865191e-01 4.075664e-22 1.386424e-08
## 101 5.797791e-40 6.581275e-03 9.934183e-01 2.336443e-21 4.150650e-07
## 102 1.129348e-43 6.715616e-04 9.993284e-01 8.822364e-25 8.101000e-09
## 103 1.519797e-44 8.560932e-04 9.991439e-01 7.026563e-25 8.534923e-09
## 104 1.072824e-41 5.228114e-02 9.477188e-01 5.364178e-22 3.979909e-08
## 105 9.474271e-42 1.204094e-01 8.795906e-01 9.528723e-22 1.027816e-08
## 106 2.099123e-38 2.756687e-02 9.724324e-01 2.237055e-20 7.554422e-07
## 107 1.500976e-41 1.894491e-03 9.981053e-01 6.075454e-23 1.646589e-07
## 108 7.582965e-41 1.350612e-02 9.864934e-01 4.886870e-22 5.112412e-07
## 109 1.159052e-41 4.273718e-03 9.957260e-01 4.612340e-23 3.014933e-07
## 110 1.236022e-39 1.937390e-02 9.806256e-01 3.710685e-21 5.307770e-07
## 111 9.363143e-36 9.675419e-02 9.031412e-01 6.230633e-18 1.046118e-04
## 112 2.521766e-48 1.019972e-03 9.989800e-01 6.347261e-27 2.842513e-10
## 113 2.366821e-40 1.106565e-02 9.889320e-01 1.833082e-22 2.380410e-06
## 114 2.139326e-46 3.787905e-03 9.962121e-01 9.783521e-26 1.222738e-08
## 115 3.000408e-35 1.185013e-01 8.814743e-01 2.358459e-17 2.437994e-05
## 116 1.029384e-31 5.310740e-01 4.686814e-01 6.615406e-15 2.445863e-04
## 117 5.020746e-41 3.798039e-03 9.962020e-01 1.254353e-21 2.158847e-09
## 118 6.276554e-41 1.591499e-03 9.984085e-01 2.212176e-22 2.948090e-08
## 119 1.436218e-35 1.420163e-01 8.579762e-01 1.685344e-17 7.559336e-06
## 120 3.911344e-42 1.287187e-02 9.871276e-01 3.385378e-22 5.154126e-07
## 121 1.233831e-31 4.156203e-01 5.841918e-01 9.209767e-15 1.878837e-04
## 122 4.068919e-33 2.809416e-01 7.190414e-01 2.107899e-16 1.695986e-05
## 123 9.430770e-38 7.999302e-03 9.919994e-01 2.645948e-20 1.284451e-06
## 124 1.103292e-02 5.879186e-17 1.192947e-28 9.889671e-01 6.307656e-10
## 125 7.166374e-01 1.052470e-17 1.134781e-28 2.833626e-01 8.496365e-10
## 126 3.165664e-05 9.877481e-12 5.947841e-21 9.999683e-01 7.329243e-08
## 127 1.991745e-07 2.927367e-06 2.817181e-14 9.996289e-01 3.679478e-04
## 128 6.533605e-02 5.842675e-12 1.311171e-20 9.332813e-01 1.382682e-03
## 129 3.469312e-06 1.888394e-06 8.393290e-14 9.999074e-01 8.722890e-05
## 130 8.970195e-09 9.679287e-06 2.452483e-13 9.981685e-01 1.821859e-03
## 131 2.275639e-03 5.457915e-15 8.692626e-26 9.977244e-01 1.748185e-11
## 132 3.270257e-06 1.950705e-10 4.857214e-20 9.999959e-01 8.606808e-07
## 133 3.176078e-05 1.040701e-12 3.836468e-24 9.999682e-01 8.824667e-08
## 134 3.298825e-03 7.199594e-15 6.320985e-26 9.967012e-01 1.984768e-09
## 135 9.518914e-01 2.632940e-21 2.847268e-36 4.810856e-02 2.472799e-14
## 136 4.953016e-05 6.464711e-11 9.715013e-21 9.999172e-01 3.328766e-05
## 137 9.964674e-06 3.523602e-10 7.007524e-19 9.999892e-01 8.236874e-07
## 138 8.661018e-03 1.190726e-16 9.876948e-30 9.913390e-01 5.021810e-12
## 139 1.220144e-01 1.628443e-19 1.188388e-31 8.779856e-01 4.956328e-12
## 140 1.712322e-05 5.046468e-13 7.357370e-24 9.999828e-01 2.801336e-08
## 141 5.561998e-03 3.963657e-13 6.883105e-23 9.944370e-01 9.740002e-07
## 142 1.187606e-06 8.391131e-09 3.476934e-18 9.999924e-01 6.441958e-06
## 143 4.530429e-06 1.241473e-09 1.254528e-17 9.999945e-01 9.935251e-07
## 144 5.407505e-06 1.202947e-08 2.195250e-15 9.999246e-01 7.001282e-05
## 145 4.378608e-01 5.659767e-17 6.626623e-28 5.621392e-01 3.153738e-10
## 146 5.662409e-03 2.288701e-13 1.466393e-22 9.943374e-01 2.095539e-07
## 147 6.646770e-28 9.381032e-01 1.623816e-02 6.034434e-13 4.565868e-02
## 148 1.173835e-26 9.945164e-01 3.479916e-03 8.240855e-12 2.003665e-03
## 149 7.663305e-26 9.706679e-01 1.809782e-02 2.895683e-11 1.123424e-02
## 150 1.238079e-30 8.641752e-01 1.325122e-01 1.838941e-13 3.312564e-03
## 151 5.767366e-33 6.369422e-01 3.630562e-01 3.563824e-14 1.614370e-06
## 152 5.805751e-32 9.601655e-01 3.959420e-02 8.816802e-15 2.402649e-04
## 153 9.003531e-28 9.855130e-01 1.447462e-02 1.218402e-11 1.241010e-05
## 154 4.549909e-25 9.976879e-01 7.171961e-04 5.546007e-11 1.594855e-03
## 155 8.140078e-28 9.625103e-01 2.640654e-02 9.354542e-13 1.108313e-02
## 156 3.997553e-24 9.441425e-01 5.008936e-04 8.358528e-10 5.535660e-02
## 157 4.181275e-32 9.807197e-01 1.386481e-02 1.449299e-15 5.415493e-03
## 158 1.392966e-25 9.925921e-01 4.324846e-04 4.466156e-11 6.975408e-03
## 159 5.019876e-25 9.143819e-01 2.522040e-03 1.671460e-11 8.309610e-02
## 160 2.376784e-25 9.997612e-01 7.702843e-05 1.192792e-10 1.617742e-04
## 161 5.323889e-28 9.961952e-01 7.510109e-04 3.396016e-13 3.053743e-03
## 162 3.875753e-28 9.659364e-01 3.074935e-02 7.218324e-13 3.314277e-03
## 163 2.454639e-28 9.480166e-01 5.163288e-02 1.145527e-11 3.505326e-04
## 164 2.594372e-29 8.550092e-01 1.156595e-01 3.511476e-14 2.933129e-02
## 165 7.641320e-34 9.847876e-01 1.485525e-02 6.358823e-17 3.571412e-04
## 166 3.704205e-35 1.365943e-01 8.628088e-01 3.094405e-18 5.968832e-04
## 167 1.717562e-34 8.096734e-01 1.902732e-01 3.575023e-17 5.342446e-05
## 168 9.282051e-24 8.225437e-01 1.702333e-02 2.902250e-09 1.604329e-01
## 169 1.201986e-24 9.891443e-01 7.447179e-05 1.002498e-09 1.078125e-02
## 170 7.511923e-21 9.754572e-01 1.870440e-03 6.162511e-08 2.267232e-02
## 171 1.137026e-09 1.722857e-02 2.237823e-06 4.974914e-02 9.330201e-01
## 172 3.967003e-21 2.031403e-01 8.326963e-07 2.978880e-08 7.968589e-01
## 173 3.451476e-21 6.894674e-03 1.118918e-05 9.228102e-10 9.930941e-01
## 174 1.085982e-18 1.924787e-02 2.196291e-06 1.826509e-07 9.807498e-01
## 175 1.468064e-18 9.428294e-03 2.116784e-07 5.769376e-08 9.905714e-01
## 176 2.686315e-17 3.708632e-04 9.186237e-08 3.602024e-07 9.996287e-01
## 177 2.286623e-41 1.355152e-01 8.644831e-01 7.959830e-22 1.681372e-06
## 178 1.931042e-36 1.751052e-01 8.248946e-01 9.192416e-18 1.790896e-07
## 179 4.617394e-44 1.834927e-03 9.981651e-01 8.407333e-24 8.859799e-09
## 180 1.299002e-49 2.287236e-03 9.977128e-01 2.354956e-28 2.339305e-10
## 181 4.567787e-44 5.530171e-02 9.446983e-01 2.017839e-23 6.715686e-09
## 
## $terms
## Genus..2. ~ LM + TD + TI + TP + LL + WD + WI + LI
## attr(,"variables")
## list(Genus..2., LM, TD, TI, TP, LL, WD, WI, LI)
## attr(,"factors")
##           LM TD TI TP LL WD WI LI
## Genus..2.  0  0  0  0  0  0  0  0
## LM         1  0  0  0  0  0  0  0
## TD         0  1  0  0  0  0  0  0
## TI         0  0  1  0  0  0  0  0
## TP         0  0  0  1  0  0  0  0
## LL         0  0  0  0  1  0  0  0
## WD         0  0  0  0  0  1  0  0
## WI         0  0  0  0  0  0  1  0
## LI         0  0  0  0  0  0  0  1
## attr(,"term.labels")
## [1] "LM" "TD" "TI" "TP" "LL" "WD" "WI" "LI"
## attr(,"order")
## [1] 1 1 1 1 1 1 1 1
## attr(,"intercept")
## [1] 1
## attr(,"response")
## [1] 1
## attr(,".Environment")
## <environment: R_GlobalEnv>
## attr(,"predvars")
## list(Genus..2., LM, TD, TI, TP, LL, WD, WI, LI)
## attr(,"dataClasses")
## Genus..2.        LM        TD        TI        TP        LL        WD 
##  "factor" "numeric" "numeric" "numeric" "numeric" "numeric" "numeric" 
##        WI        LI 
## "numeric" "numeric" 
## 
## $call
## lda(formula = Genus..2. ~ LM + TD + TI + TP + LL + WD + WI + 
##     LI, data = camels, prior = c(1, 1, 1, 1, 1)/5, CV = TRUE, 
##     na.action = "na.omit")
## 
## $xlevels
## named list()

Class:

Classification of each sample in the original data set

Posterior:

Probabilities of each sample being categorized within each group

Terms:

The model terms, i.e., the formula of predictors used to fit the discriminant function

# Assess the accuracy of the prediction
confusion(camels$Genus..2., knifed_camel$class,prior= c(1,1,1,1,1)/5)
##             Overall accuracy  Prior frequency.Aepycamelus 
##                       0.9551                       0.2000 
##     Prior frequency.Alforjas Prior frequency.Hemiauchenia 
##                       0.2000                       0.2000 
##  Prior frequency.Megatylopus   Prior frequency.Procamelus 
##                       0.2000                       0.2000 
## 
## Confusion matrix 
##               Predicted
## Actual         Aepycamelus Alforjas Hemiauchenia Megatylopus Procamelus
##   Aepycamelus       0.9167   0.0000       0.0000      0.0833          0
##   Alforjas          0.0000   0.9583       0.0417      0.0000          0
##   Hemiauchenia      0.0000   0.0125       0.9875      0.0000          0
##   Megatylopus       0.0870   0.0000       0.0000      0.9130          0
##   Procamelus        0.0000   0.0000       0.0000      0.0000          1
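Under the hood, CV=TRUE amounts to the following leave-one-out loop. A minimal sketch of the idea on the built-in iris data (the camel version just swaps in the camels formula and data):

```r
library(MASS)
n <- nrow(iris)
pred <- character(n)
for (i in seq_len(n)) {
  # refit the discriminant function without sample i ...
  fit <- lda(Species ~ ., data = iris[-i, ], prior = rep(1, 3)/3)
  # ... then classify the held-out sample with that function
  pred[i] <- as.character(predict(fit, newdata = iris[i, ])$class)
}
mean(pred == iris$Species)  # jackknifed accuracy
```

CV=TRUE is far faster because MASS updates the fit rather than refitting from scratch, but the result is the same kind of cross-validated classification.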

Iris Dataset

Fisher’s 1936 Paper
Ronald Fisher thinking about discriminant analysis

Now with the iris data set we can repeat what we did with the camelids and run a linear discriminant analysis on our data.
We can also plot our data using the {ggord} package again.

ldafit <- lda(Species ~ ., iris, prior = rep(1, 3)/3)
ldafit 
## Call:
## lda(Species ~ ., data = iris, prior = rep(1, 3)/3)
## 
## Prior probabilities of groups:
##     setosa versicolor  virginica 
##  0.3333333  0.3333333  0.3333333 
## 
## Group means:
##            Sepal.Length Sepal.Width Petal.Length Petal.Width
## setosa            5.006       3.428        1.462       0.246
## versicolor        5.936       2.770        4.260       1.326
## virginica         6.588       2.974        5.552       2.026
## 
## Coefficients of linear discriminants:
##                     LD1         LD2
## Sepal.Length  0.8293776  0.02410215
## Sepal.Width   1.5344731  2.16452123
## Petal.Length -2.2012117 -0.93192121
## Petal.Width  -2.8104603  2.83918785
## 
## Proportion of trace:
##    LD1    LD2 
## 0.9912 0.0088
irispredict<-predict(ldafit)
confusion(iris$Species, irispredict$class)
##           Overall accuracy     Prior frequency.setosa 
##                     0.9800                     0.3333 
## Prior frequency.versicolor  Prior frequency.virginica 
##                     0.3333                     0.3333 
## 
## Confusion matrix 
##             Predicted
## Actual       setosa versicolor virginica
##   setosa          1       0.00      0.00
##   versicolor      0       0.96      0.04
##   virginica       0       0.02      0.98
ggord(ldafit, iris$Species)

Quadratic Discriminant Analysis with 3 groups
Reminder: quadratic discriminant analysis is used when we can't assume homogeneity of our variance-covariance matrices
Again we use the {MASS} package, but this time with the qda() function

library(MASS)
qdafit <- qda(Species ~ Sepal.Length+Sepal.Width+Petal.Length+Petal.Width, iris, prior = rep(1,3)/3)
qdafit
## Call:
## qda(Species ~ Sepal.Length + Sepal.Width + Petal.Length + Petal.Width, 
##     data = iris, prior = rep(1, 3)/3)
## 
## Prior probabilities of groups:
##     setosa versicolor  virginica 
##  0.3333333  0.3333333  0.3333333 
## 
## Group means:
##            Sepal.Length Sepal.Width Petal.Length Petal.Width
## setosa            5.006       3.428        1.462       0.246
## versicolor        5.936       2.770        4.260       1.326
## virginica         6.588       2.974        5.552       2.026

Once again we get the prior probabilities of groups and the group means.

confusion(iris$Species, predict(qdafit)$class)
##           Overall accuracy     Prior frequency.setosa 
##                     0.9800                     0.3333 
## Prior frequency.versicolor  Prior frequency.virginica 
##                     0.3333                     0.3333 
## 
## Confusion matrix 
##             Predicted
## Actual       setosa versicolor virginica
##   setosa          1       0.00      0.00
##   versicolor      0       0.96      0.04
##   virginica       0       0.02      0.98
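One quick way to gauge whether the equal-covariance assumption behind LDA is reasonable is simply to compare the per-group covariance matrices; a small base-R sketch:

```r
# per-group variance-covariance matrices of the four iris measurements
covs <- lapply(split(iris[, 1:4], iris$Species), cov)
round(covs$setosa, 3)     # compact cluster, small variances
round(covs$virginica, 3)  # noticeably larger variances: QDA's flexibility can help
```

Formal tests of covariance homogeneity exist (e.g., Box's M), but eyeballing the matrices is often enough to decide whether QDA is worth trying.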

Some cool exploratory visualization tools

The {klaR} package allows us to look at how successful our classification methods are for every combination of 2 variables.

#Exploratory graph for LDA or QDA
library(klaR) 
partimat(Species ~ Sepal.Length+Sepal.Width+Petal.Length+Petal.Width,data=iris,method="qda")

Similarly, we can use the pairs() function in the base {graphics} package to look at pairwise combinations of our variables

pairs(iris[c("Sepal.Length","Sepal.Width","Petal.Length", "Petal.Width")], main="Scatterplot matrix for iris data", pch=22, 
   bg=c("red", "yellow", "blue")[unclass(iris$Species)])

If we want to look at three variables at the same time, 3D plots are useful again (note that with three iris species there are only two linear discriminants, so here we plot the raw measurements).
First with the {scatterplot3d} package

library(scatterplot3d)
par(mfrow = c(2, 2)) #2x2 grid of panels
mar0 <- c(2, 3, 2, 3)
scatterplot3d(iris[, 1], iris[, 2], iris[, 3], mar = mar0,
              color = c("blue", "black", "red")[iris$Species], pch = 19)
scatterplot3d(iris[, 2], iris[, 3], iris[, 4], mar = mar0,
              color = c("blue", "black", "red")[iris$Species], pch = 19)
scatterplot3d(iris[, 3], iris[, 4], iris[, 1], mar = mar0,
              color = c("blue", "black", "red")[iris$Species], pch = 19)
scatterplot3d(iris[, 4], iris[, 1], iris[, 2], mar = mar0,
              color = c("blue", "black", "red")[iris$Species], pch = 19)

plot_ly(iris, x=iris[,1], y=iris[,2], z=iris[,3]) %>% #indicate what your axes are (here the first three iris measurements, not LD scores)
  add_markers( #add markers and a hover label
    text=~paste(paste("Sepal.Length:", iris[,1]), paste("Sepal.Width:", iris[,2]), paste("Petal.Length:", iris[,3]), paste("Species:", iris$Species), sep="<br />"), #what each element in your hover label will be and how you will separate them (if sep="<br />" wasn't here they would all print on one long hover label)
    color=~iris$Species, symbol=I("circle"), size=I(6), hoverinfo="text") %>% #how points are defined (by species), the symbol and symbol size of each point, and what info is shown in the hover label (here, what we just defined as "text")
  layout(scene=list(xaxis=list(title="Sepal.Length"),
                    yaxis=list(title="Sepal.Width"),
                    zaxis=list(title="Petal.Length"))) #axis labels matching the plotted variables